Demystifying Relational Latent Representations
Authors
Abstract
Latent features learned by deep learning approaches have proven to be a powerful tool for machine learning. They serve as a data abstraction that makes learning easier by explicitly capturing regularities in the data. These benefits have motivated their adaptation to the relational learning context. In our previous work, we introduced an approach that learns relational latent features by means of clustering instances and their relations. The major drawback of latent representations is that they are often black-box and difficult to interpret. This work addresses these issues and shows that (1) latent features created by clustering are interpretable and capture interesting properties of the data; (2) they identify local regions of instances that match well with the label, which partially explains their benefit; and (3) although the number of latent features generated by this approach is large, many of them are often highly redundant and can be removed without hurting performance much.
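To make the clustering-based construction and the redundancy pruning above concrete, the following Python sketch derives binary latent features from cluster memberships and then greedily drops near-duplicate features. It is a simplified stand-in for the approach summarised in the abstract, not the authors' implementation; the function names, the cluster granularities, and the correlation threshold are illustrative assumptions.

```python
# Minimal sketch: cluster-membership indicators as latent features,
# followed by greedy removal of highly redundant features.
import numpy as np
from sklearn.cluster import KMeans

def latent_features_by_clustering(X, cluster_sizes=(5, 10, 20), seed=0):
    """Cluster the instances at several granularities and return one
    binary indicator feature per discovered cluster."""
    columns = []
    for k in cluster_sizes:
        labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
        columns.extend((labels == c).astype(float) for c in range(k))
    return np.column_stack(columns)

def prune_redundant(F, threshold=0.95):
    """Greedily drop features whose absolute correlation with an
    already kept feature exceeds the threshold."""
    kept = []
    for j in range(F.shape[1]):
        col = F[:, j]
        if col.std() == 0:
            continue  # constant indicator carries no information
        if all(abs(np.corrcoef(col, F[:, i])[0, 1]) < threshold for i in kept):
            kept.append(j)
    return F[:, kept], kept
```

Because the features are membership indicators, they are binary by construction, which makes it straightforward to inspect which instances a given latent feature groups together.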
Similar papers
KBLRN: End-to-End Learning of Knowledge Base Representations with Latent, Relational, and Numerical Features
We present KBLRN, a novel framework for end-to-end learning of knowledge base representations from latent, relational, and numerical features. We discuss the advantages of each of the three feature types and the benefits of their combination. To the best of our knowledge, KBLRN is the first machine learning approach that learns representations of knowledge bases by integrating latent, relationa...
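As an illustration of what combining the three feature types can look like, here is a minimal late-fusion sketch in Python: latent, relational, and numerical features of a candidate fact are concatenated and scored linearly. This is only an illustrative simplification, not the actual KBLRN architecture or its combination scheme; the function and parameter names are assumptions.

```python
# Minimal sketch: score a candidate fact from the concatenation of its
# latent, relational, and numerical feature blocks.
import numpy as np

def combined_score(latent_feats, relational_feats, numerical_feats, weights, bias=0.0):
    """Linear score over the concatenated feature blocks (illustrative only)."""
    x = np.concatenate([latent_feats, relational_feats, numerical_feats])
    return float(np.dot(weights, x) + bias)
```

A learned model would fit `weights` jointly; the paper's actual way of combining the three feature types may differ from this simple concatenation.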
Bayesian factorization of joint categorical distributions for relational data and classical conditioning models
We explore the problem of inferring latent structure in the joint probability of categorical variables by factorizing them into a latent representation for categories and a weight matrix that encodes a PMF, mapping these latent representations to the mass associated with seeing categories appear together. The prior for latent category representations is either the Chinese Restaurant Process (CRP...
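The factorization described in this snippet can be illustrated, in a much simplified form, by a non-negative low-rank approximation of the empirical joint PMF; the CRP prior over latent categories and the Bayesian inference machinery are omitted, and all names below are illustrative.

```python
# Minimal sketch: approximate the joint PMF of two categorical variables
# by low-rank non-negative factors (latent category representations).
import numpy as np
from sklearn.decomposition import NMF

def factorize_joint_pmf(counts, n_latent=5, seed=0):
    """Approximate P[i, j] by a product of latent category representations."""
    pmf = counts / counts.sum()                       # empirical joint PMF
    model = NMF(n_components=n_latent, init="nndsvda", random_state=seed, max_iter=500)
    U = model.fit_transform(pmf)                      # latent representation of row categories
    V = model.components_.T                           # latent representation of column categories
    approx = U @ V.T
    return U, V, approx / approx.sum()                # renormalise to a proper PMF
```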
Analogical Inference for Multi-relational Embeddings
Large-scale multi-relational embedding refers to the task of learning the latent representations for entities and relations in large knowledge graphs. An effective and scalable solution for this problem is crucial for the true success of knowledge-based inference in a broad range of applications. This paper proposes a novel framework for optimizing the latent representations with respect to the ...
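For readers unfamiliar with the setup, the sketch below shows what latent representations for entities and relations mean operationally, using a generic multiplicative (DistMult-style) scoring function rather than the analogical optimization proposed in this paper; the sizes and names are assumptions.

```python
# Minimal sketch: multiplicative triple scoring over entity and relation embeddings.
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_relations, dim = 1000, 20, 64
E = rng.normal(scale=0.1, size=(n_entities, dim))    # entity embeddings
R = rng.normal(scale=0.1, size=(n_relations, dim))   # relation embeddings

def score(head, relation, tail):
    """Plausibility of the triple (head, relation, tail)."""
    return float(np.sum(E[head] * R[relation] * E[tail]))
```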
Sparse Relational Topic Models for Document Networks
Learning latent representations is playing a pivotal role in machine learning and many application areas. Previous work on the relational topic model (RTM) has shown promise on learning latent topical representations for describing relational document networks and predicting pairwise links. However, under a probabilistic formulation with normalization constraints, RTM could be ineffective in con...
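As background for this snippet, a relational topic model predicts a link between two documents from their topic proportions; the sketch below shows that link likelihood in its common sigmoid form, with topic inference and the sparsity mechanism of the sparse RTM omitted. Variable names are assumptions.

```python
# Minimal sketch: RTM-style link probability from topic proportions.
import numpy as np

def link_probability(theta_d, theta_e, eta, nu=0.0):
    """sigmoid(eta . (theta_d * theta_e) + nu): link probability from the
    element-wise product of two documents' topic proportions."""
    z = float(np.dot(eta, theta_d * theta_e) + nu)
    return 1.0 / (1.0 + np.exp(-z))
```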
Towards Holistic Concept Representations: Embedding Relational Knowledge, Visual Attributes, and Distributional Word Semantics
Knowledge Graphs (KGs) effectively capture explicit relational knowledge about individual entities. However, visual attributes of those entities, like their shape and color, and pragmatic aspects concerning their usage in natural language, are not covered. Recent approaches encode such knowledge by learning latent representations (‘embeddings’) separately: In computer vision, visual object featur...